Heteroskedastic linear regression: steps towards adaptivity, efficiency, and robustness
Abstract
In linear regression with heteroskedastic errors, the Generalized Least Squares (GLS) estimator is optimal, i.e., it is the Best Linear Unbiased Estimator (BLUE). The Ordinary Least Squares (OLS) estimator is suboptimal but still valid, i.e., unbiased and consistent. Halbert White, in his seminal paper (Econometrica, 1980), used the OLS residuals to obtain an estimate of the standard error of the OLS estimator under an unknown structure of the underlying heteroskedasticity. The GLS estimator likewise depends on the unknown heteroskedasticity and is thus infeasible. In this paper, we introduce two different approximations to the optimal GLS estimator; the starting point for both approaches is in the spirit of White’s correction, i.e., using the OLS residuals to get a rough estimate of the underlying heteroskedasticity. We show how the new estimators can benefit from the Wild Bootstrap, both for optimising them and for providing valid standard errors despite their complicated construction. The performance of the new estimators is compared via simulations to the OLS and to the exact (but infeasible) GLS.
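To make the two ingredients of the abstract concrete, the following is a minimal numerical sketch, not the paper's proposed estimators: it computes White's heteroskedasticity-consistent (HC0) standard errors for OLS, and then a crude feasible-GLS step that reweights observations by variances fitted from the OLS residuals. The simulated model and the log-linear variance fit are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(1.0, 5.0, n)
X = np.column_stack([np.ones(n), x])            # design: intercept + slope
sigma = 0.5 * x                                  # true heteroskedastic scale (assumed for the demo)
y = 1.0 + 2.0 * x + sigma * rng.standard_normal(n)

# --- OLS fit and residuals ---
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_ols

# --- White (1980) HC0 covariance: (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1} ---
XtX_inv = np.linalg.inv(X.T @ X)
meat = X.T @ (X * resid[:, None] ** 2)
se_hc0 = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

# --- crude feasible GLS: fit log(e_i^2) linearly in the regressors
#     to estimate sigma_i^2, then do weighted least squares ---
gamma, *_ = np.linalg.lstsq(X, np.log(resid ** 2 + 1e-12), rcond=None)
w = np.exp(X @ gamma)                            # fitted variances sigma_i^2
Xw = X / w[:, None]                              # X scaled by 1/sigma_i^2
beta_fgls = np.linalg.solve(X.T @ Xw, Xw.T @ y)  # (X'W^{-1}X)^{-1} X'W^{-1}y
```

Both estimators are consistent for the same coefficients; the feasible-GLS step typically has smaller variance when the variance model is roughly right, which is the gap the paper's estimators aim to close more systematically.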